Understanding the message passing in graph neural networks via power iteration clustering

Authors

Abstract

The mechanism of message passing in graph neural networks (GNNs) is still mysterious. Apart from convolutional neural networks, no theoretical origin for GNNs has been proposed. To our surprise, message passing can be best understood in terms of power iteration. By fully or partly removing activation functions and layer weights of GNNs, we propose subspace power iteration clustering (SPIC) models that iteratively learn with only one aggregator. Experiments show that our models extend GNNs and enhance their capability to process random featured networks. Moreover, we demonstrate the redundancy of some state-of-the-art GNN designs and define a lower limit for model evaluation by a random aggregator of message passing. Our findings push the boundaries of the theoretical understanding of neural networks.
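The core observation above — that message passing stripped of activations and layer weights reduces to repeated multiplication by a fixed aggregator, i.e. power/subspace iteration — can be sketched in a few lines of NumPy. This is an illustrative toy, not the paper's exact SPIC model: the graph, the GCN-style symmetric normalization, and the sign-based cluster readout are all assumptions chosen for brevity.

```python
import numpy as np

# Toy graph: two triangles joined by a single bridge edge (nodes 2-3).
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

# GCN-style aggregator: S = D^{-1/2} (A + I) D^{-1/2}.
A_hat = A + np.eye(len(A))
d = A_hat.sum(axis=1)
S = A_hat / np.sqrt(np.outer(d, d))

# Message passing with no weights or activations is just X <- S X.
# On a random feature matrix this is subspace (power) iteration:
# the columns converge to the dominant eigen-subspace of S.
rng = np.random.default_rng(0)
X = rng.standard_normal((len(A), 2))
for _ in range(50):
    X = S @ X
    X, _ = np.linalg.qr(X)  # re-orthonormalize so the subspace stays stable

# The second column approximates the second eigenvector of S, whose sign
# pattern separates the two weakly connected communities (spectral clustering).
labels = (X[:, 1] > 0).astype(int)
print(labels)
```

Even with purely random input features, the iteration recovers the two triangles as clusters, which mirrors the paper's claim that the aggregator alone, not the learned weights, drives much of what message passing does.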


Similar resources

Evolutionary Clustering via Message Passing

When data are acquired at multiple points in time, evolutionary clustering can provide insights into cluster evolution and changes in cluster memberships while enabling performance superior to that obtained by independently clustering data collected at different time points. Existing evolutionary clustering methods typically require additional steps before and after the clustering stage to appr...


Sketched Clustering via Hybrid Approximate Message Passing

In sketched clustering, the dataset is first sketched down to a vector of modest size, from which the cluster centers are subsequently extracted. The goal is to perform clustering more efficiently than with methods that operate on the full training data, such as k-means++. For the sketching methodology recently proposed by Keriven, Gribonval, et al., which can be interpreted as a random samplin...


Mapping Neural Networks onto Message-Passing Multicomputers

This paper investigates the architectural requirements for simulating neural networks using massively parallel multiprocessors. First, we model the connectivity patterns in large neural networks. A distributed processor/memory organization is developed for efficiently simulating asynchronous, value-passing connectionist models. Based on the network connectivity and mapping policy, we estimate t...


Mapping Neural Networks onto Message-Passing

This paper investigates the architectural requirements for simulating neural networks using massively parallel multiprocessors. First, we model the connectivity patterns in large neural networks. A distributed processor/memory organization is developed for efficiently simulating asynchronous, value-passing connectionist models. On the basis of the network connectivity and mapping policy, we est...


Low-rank matrix reconstruction and clustering via approximate message passing

We study the problem of reconstructing low-rank matrices from their noisy observations. We formulate the problem in the Bayesian framework, which allows us to exploit structural properties of matrices in addition to low-rankedness, such as sparsity. We propose an efficient approximate message passing algorithm, derived from the belief propagation algorithm, to perform the Bayesian inference for...



Journal

Journal title: Neural Networks

Year: 2021

ISSN: 1879-2782, 0893-6080

DOI: https://doi.org/10.1016/j.neunet.2021.02.025